Total variation error bounds for geometric approximation
Authors
Abstract
We develop a new formulation of Stein’s method to obtain computable upper bounds on the total variation distance between the geometric distribution and a distribution of interest. Our framework reduces the problem to the construction of a coupling between the original distribution and the “discrete equilibrium” distribution from renewal theory. We illustrate the approach in four nontrivial exam...
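For orientation, here is a brief sketch of the standard objects involved (textbook definitions rather than the paper's exact statements; conventions for the support of the equilibrium distribution vary between sources). For integer-valued random variables $W$ and $Z$,
\[
d_{\mathrm{TV}}\bigl(\mathcal{L}(W),\mathcal{L}(Z)\bigr)
  = \sup_{A \subseteq \mathbb{Z}} \bigl|P(W \in A) - P(Z \in A)\bigr|
  = \tfrac{1}{2} \sum_{k} \bigl|P(W = k) - P(Z = k)\bigr|,
\]
and for a positive integer-valued $W$ with mean $\mu = \mathbb{E}W$, the discrete equilibrium distribution from renewal theory is
\[
P(W^{e} = k) = \frac{P(W \ge k)}{\mu}, \qquad k = 1, 2, \dots
\]
The geometric distribution on $\{1, 2, \dots\}$ with success probability $p$ is a fixed point of this transformation: if $P(W = k) = p(1-p)^{k-1}$, then $P(W \ge k) = (1-p)^{k-1}$ and $\mu = 1/p$, so $W^{e}$ has the same law as $W$. Bounds of the kind described above then control $d_{\mathrm{TV}}$ through how closely a coupling can match $W$ and $W^{e}$.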
Similar articles

Bounds for Approximation in Total Variation Distance by Quantum Circuits
It was recently shown that for reasonable notions of approximation of states and functions by quantum circuits, almost all states and functions are exponentially hard to approximate [5]. The bounds obtained are asymptotically tight except for the one based on total variation distance (TVD). TVD is the most relevant metric for the performance of a quantum circuit. In this paper we obtain asymptot...
On the bounds in Poisson approximation for independent geometric distributed random variables
The main purpose of this note is to establish some bounds in Poisson approximation for row-wise arrays of independent geometrically distributed random variables using the operator method. Some results related to random sums of independent geometrically distributed random variables are also investigated.
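As a small numerical aside (not taken from the note above; a minimal sketch assuming SciPy is available, with the success probability, the mean-matching choice, and the truncation point chosen arbitrarily for illustration), the total variation distance between a single geometric law and a mean-matched Poisson law can be evaluated directly from the pmfs:

```python
# Minimal illustration: total variation distance between a geometric law
# (number of failures before the first success) and a Poisson law with the
# same mean, computed on a truncated support. Parameters are arbitrary.
import numpy as np
from scipy import stats

def tv_distance(p_pmf, q_pmf):
    # d_TV = (1/2) * sum_k |p_k - q_k| for pmfs on a common support
    return 0.5 * float(np.sum(np.abs(p_pmf - q_pmf)))

p = 0.9                                       # success probability (small-mean regime)
k = np.arange(0, 200)                         # truncated support {0, 1, ..., 199}
geom_pmf = stats.geom.pmf(k + 1, p)           # SciPy's geom lives on {1, 2, ...}; shift to failure counts
pois_pmf = stats.poisson.pmf(k, (1 - p) / p)  # Poisson with the same mean (1 - p) / p

print(f"d_TV(Geom, Poisson) ~ {tv_distance(geom_pmf, pois_pmf):.4f}")
```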
Error Bounds for Approximation with Neural Networks
In this paper we prove convergence rates for the problem of approximating functions f by neural networks and similar constructions. We show that the smoother the activation functions are, the better the rates, provided that f satisfies an integral representation. We give error bounds not only in Hilbert spaces but in general Sobolev spaces W^{m,r}(Ω). Finally, we apply our results to a class o...
Journal
Journal title: Bernoulli
Year: 2013
ISSN: 1350-7265
DOI: 10.3150/11-bej406